
    Multi-objective particle swarm optimization algorithm for multi-step electric load forecasting

    As energy saving grows in importance, electric load forecasting has come to play an increasingly crucial role in power management systems. Because electricity is consumed in real time and the load varies unpredictably, achieving both accurate and stable load forecasts is a challenging task. Many earlier studies have obtained the expected forecasting results with various methods. With the stability of time-series prediction in mind, a novel combined electric load forecasting model is proposed, based on the extreme learning machine (ELM), a recurrent neural network (RNN), and support vector machines (SVMs). The combined model first uses the three neural networks to forecast the electric load data separately; then, because any single model has inevitable disadvantages, it applies the multi-objective particle swarm optimization algorithm (MOPSO) to optimize the combination parameters. To verify the capacity of the proposed combined model, 1-step, 2-step, and 3-step forecasts are made on electric load data from three Australian states: New South Wales, Queensland, and Victoria. The experimental results indicate that, on all three datasets, the combined model outperforms each of the three individual models used for comparison, demonstrating its superior accuracy and stability.
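    The core mechanism is a weighted combination of the base models' forecasts, with the weights tuned against the two objectives (accuracy and stability). Below is a minimal sketch of that idea: the three base forecasts are synthetic stand-ins (not ELM/RNN/SVM outputs), and the two objectives are scalarized and optimized with a plain particle swarm, which is a deliberate simplification of the paper's MOPSO.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-ins for the three base forecasters' predictions on a
# validation window; in the paper these would come from ELM, RNN, and SVM.
y_true = np.sin(np.linspace(0, 8 * np.pi, 200)) + rng.normal(0, 0.1, 200)
preds = np.stack([
    y_true + rng.normal(0, s, y_true.size)   # noisy surrogate forecasts
    for s in (0.15, 0.20, 0.25)
])                                           # shape: (3 models, 200 steps)

def objectives(w):
    """Accuracy (mean abs error) and stability (error std) of the combination."""
    w = np.abs(w) / (np.abs(w).sum() + 1e-12)  # normalize to convex weights
    err = y_true - w @ preds
    return np.abs(err).mean(), err.std()

def pso(n_particles=30, n_iters=100, c1=1.5, c2=1.5, inertia=0.7):
    """Plain PSO over the 3 combination weights; the two objectives are
    summed with equal priority (a simplification of the paper's MOPSO)."""
    x = rng.random((n_particles, 3))
    v = np.zeros_like(x)
    score = lambda w: sum(objectives(w))
    pbest, pbest_val = x.copy(), np.array([score(w) for w in x])
    gbest = pbest[pbest_val.argmin()]
    for _ in range(n_iters):
        r1, r2 = rng.random((2, n_particles, 3))
        v = inertia * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = x + v
        vals = np.array([score(w) for w in x])
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = x[improved], vals[improved]
        gbest = pbest[pbest_val.argmin()]
    return np.abs(gbest) / (np.abs(gbest).sum() + 1e-12)

w = pso()
print("combination weights:", w.round(3), "| MAE, std:", objectives(w))
```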

    Effect of Schedule Compression on Project Effort

    Schedule pressure is often faced by project managers and software developers who want to deploy information systems quickly. Typical strategies to compress project time scales include adding staff, investing in development tools, improving hardware, or improving development methods. The tradeoff between cost, schedule, and performance is one of the most important analyses performed during the planning stages of software development projects. To adequately compare the effects of these three constraints, it is essential to understand each one's individual influence on the project's outcome. In this paper, we investigate the effect of schedule compression on software project development effort and cost, and show that people are generally optimistic when estimating the amount of schedule compression. The paper is divided into three parts. First, we apply the Ideal Effort Multiplier (IEM) analysis to the SCED cost driver of the COCOMO II model. Second, we compare the real schedule compression ratio exhibited by 161 industry projects with the ratio represented by the SCED cost driver. Finally, based on this analysis, a set of newly proposed SCED driver ratings for COCOMO II is introduced, improving the model's estimation accuracy by 6%.
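    For context, this is how the SCED driver enters the post-architecture COCOMO II effort equation. The sketch below uses the published COCOMO II.2000 constants and SCED ratings as illustration; it does not reflect the recalibrated ratings this paper proposes.

```python
# Post-architecture COCOMO II effort model:
#   PM = A * Size^E * prod(EM_i),  with  E = B + 0.01 * sum(scale factors)
# Constants and SCED multipliers below are the COCOMO II.2000 values.

A, B = 2.94, 0.91

SCED_EM = {            # rating -> effort multiplier
    "very_low": 1.43,  # schedule compressed to 75% of nominal
    "low": 1.14,       # 85% of nominal
    "nominal": 1.00,
    "high": 1.00,      # relaxed schedules carry no effort penalty
    "very_high": 1.00,
}

def effort_pm(ksloc, scale_factor_sum, sced="nominal", other_em=1.0):
    """Effort in person-months for a project of `ksloc` thousand SLOC."""
    exponent = B + 0.01 * scale_factor_sum
    return A * ksloc ** exponent * SCED_EM[sced] * other_em

nominal = effort_pm(100, scale_factor_sum=18.97)  # all scale factors nominal
rushed = effort_pm(100, scale_factor_sum=18.97, sced="very_low")
print(f"nominal: {nominal:.0f} PM, 75%-schedule: {rushed:.0f} PM "
      f"(+{100 * (rushed / nominal - 1):.0f}% effort)")
```

    Note the asymmetry: compressing the schedule below nominal inflates effort (up to 43% at 75% of the nominal schedule), while stretching it adds nothing, which is exactly why mis-rating SCED matters for estimation accuracy.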

    Neighborhood VAR: Efficient estimation of multivariate timeseries with neighborhood information

    In data science, vector autoregression (VAR) models are popular for modeling multivariate time series in the environmental sciences and other applications. However, these models are computationally complex, with the number of parameters scaling quadratically in the number of time series. In this work, we propose a neighborhood vector autoregression (NVAR) model to efficiently analyze large-dimensional multivariate time series. We assume that the time series have underlying neighborhood relationships, e.g., spatial or network, arising from the inherent setting of the problem. When this neighborhood information is available or can be summarized by a distance matrix, we demonstrate that the proposed NVAR method provides a computationally efficient and theoretically sound estimate of the model parameters. The performance of the proposed method is compared with existing approaches in both simulation studies and a real application to a stream nitrogen study.
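    A minimal sketch of the neighborhood idea, under assumptions of mine rather than the paper's exact estimator: in a VAR(1), each series is regressed only on the lagged values of series within a distance threshold, so the coefficient matrix is sparse and each row has only a handful of parameters instead of p. The data, locations, and radius below are synthetic stand-ins.

```python
import numpy as np

rng = np.random.default_rng(1)

p, T, radius = 20, 500, 1.5
coords = rng.random((p, 2)) * 4                       # spatial locations
dist = np.linalg.norm(coords[:, None] - coords[None], axis=-1)
neighbors = [np.flatnonzero(dist[i] <= radius) for i in range(p)]  # incl. self

# Simulate from a sparse, stable VAR(1) that respects the neighborhoods.
A_true = np.zeros((p, p))
for i in range(p):
    A_true[i, neighbors[i]] = rng.normal(0, 0.15, neighbors[i].size)
A_true *= 0.9 / max(1e-9, np.abs(np.linalg.eigvals(A_true)).max())  # stabilize
X = np.zeros((T, p))
for t in range(1, T):
    X[t] = A_true @ X[t - 1] + rng.normal(0, 0.1, p)

# Row-wise OLS restricted to each series' neighborhood.
A_hat = np.zeros((p, p))
for i in range(p):
    Z = X[:-1][:, neighbors[i]]                       # lagged neighbor values
    A_hat[i, neighbors[i]] = np.linalg.lstsq(Z, X[1:, i], rcond=None)[0]

print("parameters fit:", sum(len(n) for n in neighbors), "of", p * p)
print("max coefficient error:", np.abs(A_hat - A_true).max().round(3))
```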

    Improving the Design of EQ-5D Value Set Studies for China and Beyond

    This thesis comprises six studies examining different aspects of EQ-5D use in China. The population norm study provided the first norm based on urban Chinese self-reported health status, which not only offered insight into HRQoL variation among subgroups but also serves as a reference point for quantifying disease burden and intervention effects. The subsequent methodological studies offered suggestions for improving the design of future valuation studies, which may strengthen health technology assessment and cost-utility analysis in China. Since EQ-5D is the most widely used instrument of its kind worldwide, it is a natural candidate for this role in China.

    _Chapter 2_ describes a descriptive exercise reporting EQ-5D-5L norm scores for the urban Chinese population. Additional analysis tested whether self-reported HRQoL varied between demographic groups; outcomes did indeed differ by age, gender, education level, health insurance status, employment status, and residence of origin.

    In _Chapter 3_, individual-level inconsistency in the TTO task was related to factors that varied during the interview, and it was found that the interviewer, rather than the respondent, mattered most in reducing individual-level inconsistency. The results suggest that the valuation process may have been influenced by interviewer effects before the implementation of the Quality Control (QC) tool used in later EQ-5D-5L TTO research.

    Commencing in _Chapter 4_, possible designs for EQ-5D valuation studies were systematically examined and compared. An EQ-5D-3L saturated dataset was used as a gold standard to compare two oft-mentioned but somewhat conflicting principles for selecting health states for direct valuation: the commonness (prevalence) of health states versus the statistical efficiency of a design. By simulating the modelling process, it was found that statistical efficiency outweighed commonness in achieving sufficient prediction accuracy for non-valued states. This suggests that the designs of previous valuation studies were not optimal, and that future studies could use a smaller design provided its statistical efficiency is guaranteed. Chapter 4 also examined the principle of commonness as a proxy for implausibility in health-state selection.

    _Chapter 5_ reports how university students valued all EQ-5D-5L states and judged the implausibility of each state. Respondents lacked agreement on which states were implausible; as there was no universally implausible state, the mean value of a state among respondents who judged it implausible was compared with that among respondents who judged it plausible. Values from implausible observations were lower, but still in agreement with values from plausible observations.

    Building on the design-selection experience with EQ-5D-3L, _Chapter 6_ tested the current EQ-VT design and sought a smaller design for EQ-5D-5L valuation studies. The good performance of an orthogonal design was confirmed again with EQ-5D-5L data: an orthogonal design with 25 states performed as well as the EQ-VT design with 86 states in terms of prediction accuracy for all 3,125 states.

    In _Chapter 7_, the most efficient TTO data design (the orthogonal design) was tested against the standard EQ-VT design and again found favorable. Overall, this thesis attempts to understand the possible effects of sample and design choices in previous Chinese valuation studies. Its findings can also be generalized to EQ-5D studies in other countries and to valuation studies employing instruments other than EQ-5D.
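    To make the "25-state orthogonal design" concrete: one classical way to obtain 25 states covering 5 dimensions at 5 levels with strength-2 balance is an orthogonal array built over GF(5), sketched below. This is a generic textbook construction, not necessarily the exact design evaluated in the thesis; every pair of levels appears exactly once in every pair of dimensions.

```python
from itertools import product

def oa_25_5_5():
    """OA(25, 5, 5, 2): rows indexed by (a, b) in GF(5)^2; column j = a*j + b mod 5."""
    return [[(a * j + b) % 5 for j in range(5)] for a, b in product(range(5), repeat=2)]

rows = oa_25_5_5()
states = ["".join(str(level + 1) for level in row) for row in rows]
print(states)  # 25 EQ-5D-5L codes (levels 1-5 per dimension), e.g. '11111'

# Sanity check of strength 2: all 25 level pairs occur once per column pair.
for c1 in range(5):
    for c2 in range(c1 + 1, 5):
        assert len({(row[c1], row[c2]) for row in rows}) == 25
```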